Meta-analysis for functions of heterogeneous multivariate effect sizes.
Author:
Hafdahl, A. R.
Year:
2009
Source: Unpublished master's thesis, Washington University in St. Louis.
In numerous research-synthesis situations one can apply standard meta-analytic procedures to estimates of an effect size but is more interested in some function of that effect size. Examples include inverse variance-stabilizing transformations; partial correlations, regression coefficients, and other functions of a correlation matrix; and odds ratios and other functions of a diagnostic test's sensitivity and specificity. I describe frequentist estimation and inference techniques for various (possibly vector-valued) nonlinear functions of heterogeneous multivariate effect sizes, with particular attention to the function's between-studies mean and (co)variance (matrix).
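For correlations, the canonical variance-stabilizing transformation mentioned in the abstract is Fisher's z. A minimal sketch of pooling study correlations in the z metric, where the sampling variance is approximately 1/(n - 3); the study data here are hypothetical:

```python
import math

def fisher_z(r):
    """Variance-stabilizing transformation for a correlation r."""
    return 0.5 * math.log((1 + r) / (1 - r))

def inv_fisher_z(z):
    """Inverse transformation: back to the correlation metric."""
    return math.tanh(z)

# Hypothetical (r_i, n_i) pairs for three studies.
studies = [(0.30, 50), (0.45, 120), (0.25, 80)]

# Inverse-variance weights in the z metric: Var(z_i) ~ 1/(n_i - 3).
w = [n - 3 for _, n in studies]
z_bar = sum(wi * fisher_z(r) for wi, (r, _) in zip(w, studies)) / sum(w)
r_bar = inv_fisher_z(z_bar)  # pooled estimate, back in the r metric
```

Note that back-transforming the mean z is itself a nonlinear function of the pooled effect size, which is exactly the situation the thesis addresses.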
Meta-analysis for indirect effects: Proposed approach and Monte Carlo study.
Author:
Hafdahl, A. R., & Carroll, I.
Year:
Source: Paper presented at the meeting of the Society for Research Synthesis Methodology, Providence, RI.
An indirect effect (IE) quantifies an intervening variable's role in the association between two other variables. In research syntheses the IE parameter may exhibit between-studies heterogeneity, and features of this parameter's distribution over studies may be of interest. Building on earlier developments in multivariate random-effects meta-analysis for correlation matrices, we propose a frequentist approach to estimation of and inference about the IE parameter's mean and variance for a simple model of mediation involving three continuous variables: An exogenous variable (X) affects a mediator (M), which in turn affects an outcome (Y). This approach hinges on treating an IE parameter as a function of a correlation-matrix parameter, so the latter's distribution implies the former's. Namely, we first estimate the between-studies mean and covariance-component matrix for the correlations among the mediation model's variables (X, M, and Y), then estimate the IE parameter's mean and variance using a low-order polynomial approximation, and finally obtain an asymptotic variance for the IE parameter's mean via the delta method. We describe several variants of this approach -- Pearson-r vs. Fisher-z correlations, full vs. diagonal covariance-component matrix, 1st- vs. 2nd-order polynomial -- and illustrate them with previously published data on anxiety, self-confidence, and sport performance. In a Monte Carlo study of these variants, we varied the correlation-matrix distribution and the number and sizes of studies; some variants performed notably better than others, and estimators of the IE parameter's mean performed better than estimators of its variance. We indicate directions for fruitful improvements upon and extensions of the proposed approach.
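The core idea, treating the IE as a function of the correlation-matrix parameter and applying the delta method, can be illustrated with a small sketch. This is not the paper's implementation; the mean vector and covariance matrix below are hypothetical, and the gradient is taken numerically rather than with the polynomial approximations the abstract describes:

```python
import numpy as np

def indirect_effect(r):
    """IE = a*b for the standardized mediation model X -> M -> Y,
    computed from the correlations r = (r_xm, r_xy, r_my)."""
    r_xm, r_xy, r_my = r
    a = r_xm                                   # X -> M path
    b = (r_my - r_xm * r_xy) / (1 - r_xm**2)   # M -> Y path, controlling X
    return a * b

def delta_var(f, mu, Sigma, eps=1e-6):
    """First-order delta-method variance of f at mean mu with
    covariance Sigma, using a central-difference gradient."""
    mu = np.asarray(mu, dtype=float)
    g = np.array([(f(mu + eps * e) - f(mu - eps * e)) / (2 * eps)
                  for e in np.eye(len(mu))])
    return float(g @ Sigma @ g)

# Hypothetical between-studies mean and covariance of (r_xm, r_xy, r_my).
mu = [0.4, 0.3, 0.5]
Sigma = np.diag([0.01, 0.01, 0.01])

ie_mean = indirect_effect(np.asarray(mu))
ie_var = delta_var(indirect_effect, mu, Sigma)
```

A diagonal Sigma corresponds to the "diagonal covariance-component matrix" variant mentioned in the abstract; a full matrix would simply replace `np.diag(...)`.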
Testing complex correlational hypotheses with structural equation models.
It is often of interest to estimate partial or semipartial correlation coefficients as indexes of the linear association between 2 variables after partialing one or both for the influence of covariates. Squaring these coefficients expresses the proportion of variance in 1 variable explained by the other variable after controlling for covariates. Methods exist for testing hypotheses about the equality of these coefficients across 2 or more groups, but they are difficult to conduct by hand, prone to error, and limited to simple cases. A unified framework is provided for estimating bivariate, partial, and semipartial correlation coefficients using structural equation modeling (SEM). Within the SEM framework, it is straightforward to test hypotheses of the equality of various correlation coefficients with any number of covariates across multiple groups. LISREL syntax is provided, along with 4 examples.
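The one-covariate case the abstract builds on can be stated compactly. The standard textbook formulas for the partial and semipartial correlation of variables 1 and 2 given covariate 3 (not the SEM parameterization the paper develops) are:

```python
import math

def partial_r(r12, r13, r23):
    """Correlation of variables 1 and 2 with covariate 3
    partialed from both variables."""
    return (r12 - r13 * r23) / math.sqrt((1 - r13**2) * (1 - r23**2))

def semipartial_r(r12, r13, r23):
    """Correlation of variable 1 with variable 2, covariate 3
    partialed from variable 2 only."""
    return (r12 - r13 * r23) / math.sqrt(1 - r23**2)
```

Squaring either coefficient gives the explained-variance proportion described in the abstract; the semipartial never exceeds the partial in absolute value because its denominator is larger.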
Meta-analysis for functions of dependent correlations.
Author:
Hafdahl, A. R.
Year:
Source: In A. R. Hafdahl (Chair), Advances in meta-analysis for multivariable linear models. Invited symposium presented at the meeting of the Association for Psychological Science, San Francisco, CA.
Many meta-analysts who work with correlations among several variables are interested in some function of dependent correlations, such as partial or (squared) multiple correlations, regression or path coefficients, or combinations of these quantities. This presentation will cover various meta-analytic techniques for such functions, including illustrations using real data. These procedures comprise both fixed- and random-effects methods, with emphasis on the latter for heterogeneous correlation matrices, as well as inference via large-sample and bootstrap strategies.
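One bootstrap strategy of the kind mentioned, resampling studies and re-applying the focal function, can be sketched for a squared multiple correlation with two predictors. The per-study correlation triples here are hypothetical, and the unweighted averaging is a simplification of any real synthesis:

```python
import random

def r2_two_predictors(r_y1, r_y2, r_12):
    """Squared multiple correlation of Y on X1 and X2,
    computed from the three bivariate correlations."""
    return (r_y1**2 + r_y2**2 - 2 * r_y1 * r_y2 * r_12) / (1 - r_12**2)

# Hypothetical per-study correlations (r_y1, r_y2, r_12).
studies = [(0.40, 0.35, 0.20), (0.50, 0.30, 0.25),
           (0.45, 0.40, 0.15), (0.38, 0.33, 0.30)]

def mean_r2(sample):
    """Apply the focal function to the (unweighted) mean correlations."""
    means = [sum(s[j] for s in sample) / len(sample) for j in range(3)]
    return r2_two_predictors(*means)

# Nonparametric bootstrap over studies: resample studies with
# replacement and recompute the function each time.
random.seed(1)
boot = sorted(mean_r2([random.choice(studies) for _ in studies])
              for _ in range(2000))
ci = (boot[int(0.025 * len(boot))], boot[int(0.975 * len(boot))])
```

The percentile interval `ci` illustrates the nonparametric route; a parametric bootstrap would instead draw correlation matrices from an estimated between-studies distribution.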
Meta-analysis for functions of heterogeneous correlation matrices.
Author:
Hafdahl, A. R.
Year:
Source: Paper presented at the annual meeting of the Psychometric Society, Durham, NH.
Correlation analyses in primary research often focus on certain substantively interesting functions of the correlations among several variables, such as partial, (squared) multiple, or canonical correlations; coefficients in regression, path, factor, or covariance-structure models; eigenvalues; or various combinations of these (e.g., contrasts, ratios). Research synthesists rarely meta-analyze such functions, however, perhaps due to lack of principled methods. In this paper I extend previous work on simpler cases (e.g., 1 study, 1 focal correlation, fixed-effects models) to meta-analytic estimation and inference for popular functions of correlations -- possibly vector-valued -- with particular attention to the case of between-studies heterogeneity. Standard techniques with recent refinements are used to estimate the correlation-matrix parameters' expectation and covariance matrix in the Pearson-r or Fisher-z metric, and an integral transformation of the implied parameter distribution yields the function's mean and covariance matrix. Proposed methods for inference about the function's mean include a delta-method covariance matrix as well as parametric and nonparametric bootstrapping. These approaches are contrasted with two related strategies that entail applying the focal function to either the observed correlation matrices or the mean correlation matrix. Monte Carlo studies of the focal techniques are presented.
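Why averaging the function over the implied parameter distribution differs from applying it to the mean matrix, the contrast drawn at the end of the abstract, comes down to nonlinearity. A toy illustration with one correlation and the function r^2; the between-studies distribution here (normal in the Fisher-z metric) is purely hypothetical:

```python
import math
import random
import statistics

def r_squared(r):
    """A simple nonlinear function of a correlation: its square."""
    return r * r

# Hypothetical heterogeneous correlation parameter: normal in the
# Fisher-z metric (mean 0.3, SD 0.15), back-transformed to r.
random.seed(7)
rhos = [math.tanh(random.gauss(0.3, 0.15)) for _ in range(20000)]

# Averaging the function over the distribution vs. applying the
# function to the mean give different answers:
mean_of_function = statistics.mean(r_squared(r) for r in rhos)
function_of_mean = r_squared(statistics.mean(rhos))
# Their gap equals the between-studies variance of r, so the
# "function of the mean matrix" shortcut is biased downward here.
```

With no heterogeneity the two quantities coincide, which is why the distinction only bites in the random-effects setting the paper emphasizes.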